
feat: byok #129

Merged
kah-seng merged 30 commits into staging from feat/byok on Mar 26, 2026

Conversation


@kah-seng kah-seng commented Feb 23, 2026

This PR allows users to use their own OpenAI-compatible endpoints and API keys (including providers other than OpenAI). The implementation is not complete yet, but I am creating this draft PR to get early feedback.

Settings

Replaces the previous OpenAI key input in the settings. Users can specify a name, slug, base URL, and API key.
(screenshot)

Model Selection

Models with user-specified API keys have "(Custom)" appended to the slug.
(screenshot)

Questions

  1. If users are allowed to specify their own models, should the disabled models be removed from the frontend in the model selection?
  2. While working on this feature, I also noticed that the backend only stores the model that a conversation was started with. Subsequent changes of the model are not written to the database. Is this intended, or should conversations resume from the last model used in that specific conversation?

Todo

  • Input validation
  • Polishing of UI
  • Allow any slug. Currently slugs must be chosen from a dropdown list of supported models as reported by the backend
  • Show context window and prices as optional inputs (for the user's reference); the backend already supports this
  • Further testing with other model providers
  • Merge conflicts
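As a rough sketch of what the settings form above collects, the per-user BYOK entry could look like the following. This is illustrative only; the actual struct lives in internal/models/user.go and its field names may differ.

```go
package main

import "fmt"

// CustomModel is an illustrative sketch of one user-defined BYOK entry,
// matching the four inputs described in the settings section above.
type CustomModel struct {
	Name     string // display name shown in settings
	Slug     string // model identifier sent to the provider
	Endpoint string // OpenAI-compatible base URL
	APIKey   string // user-supplied key, stored per user
}

func main() {
	m := CustomModel{
		Name:     "My Llama",
		Slug:     "llama-3.1-70b",
		Endpoint: "https://api.example.com/v1",
		APIKey:   "sk-...",
	}
	// The model selector appends "(Custom)" to the slug for these entries.
	fmt.Println(m.Slug + " (Custom)") // prints "llama-3.1-70b (Custom)"
}
```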

@kah-seng kah-seng self-assigned this Feb 23, 2026
@kah-seng kah-seng added the enhancement New feature or request label Feb 23, 2026

Junyi-99 commented Mar 4, 2026

@kah-seng Thanks for the draft; a couple of quick answers to your questions:

  1. Yes, if users can specify their own models, the frontend should hide disabled models.

  2. Previously, we didn’t support changing models mid-conversation, so the database only stores the initial model. Going forward, we should track the model name for each message request instead of a single conversation-level field.
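The per-message tracking suggested above could be sketched roughly as follows. Types and field names are hypothetical, not the actual schema; the fallback preserves behavior for old conversations that only have the conversation-level field.

```go
package main

import "fmt"

// Message records the model used for each individual request,
// as proposed above. Illustrative types only.
type Message struct {
	Role  string
	Text  string
	Model string // model used for this specific request
}

type Conversation struct {
	InitialModel string // legacy conversation-level field
	Messages     []Message
}

// lastModel returns the model of the most recent message that recorded one,
// falling back to the conversation-level field for older conversations.
func lastModel(c Conversation) string {
	for i := len(c.Messages) - 1; i >= 0; i-- {
		if c.Messages[i].Model != "" {
			return c.Messages[i].Model
		}
	}
	return c.InitialModel
}

func main() {
	c := Conversation{
		InitialModel: "gpt-4o",
		Messages: []Message{
			{Role: "user", Text: "hi", Model: "gpt-4o"},
			{Role: "user", Text: "again", Model: "my-custom-model"},
		},
	}
	fmt.Println(lastModel(c)) // prints "my-custom-model"
}
```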

I’ll help review the code and get this feature shipped as soon as possible.

@kah-seng kah-seng marked this pull request as ready for review March 20, 2026 08:08
Copilot AI review requested due to automatic review settings March 20, 2026 08:08
Contributor

Copilot AI left a comment


Pull request overview

This PR introduces “Bring Your Own Key (BYOK)” support so users can configure their own OpenAI-compatible endpoints/API keys via settings, and have those custom models appear in model selection and be used by the backend when creating chat completions.

Changes:

  • Add CustomModel to user settings (proto + backend model + mapper) and surface custom models via ListSupportedModelsV2.
  • Update the webapp settings UI to CRUD custom models and update the model selector to label them as custom.
  • Update the AI client/config plumbing to route requests through user-provided endpoint/API key when a custom model is selected.

Reviewed changes

Copilot reviewed 18 out of 18 changed files in this pull request and generated 7 comments.

Summary per file:

| File | Description |
| --- | --- |
| `webapp/_webapp/src/views/settings/sections/api-key-settings.tsx` | New BYOK settings modal + custom model CRUD UI wired into the settings store |
| `webapp/_webapp/src/views/chat/footer/toolbar/model-selection.tsx` | Appends "(Custom)" to custom model subtitles |
| `webapp/_webapp/src/pkg/gen/apiclient/user/v1/user_pb.ts` | Regenerated TS API types to include `CustomModel` + `Settings.customModels` |
| `webapp/_webapp/src/pkg/gen/apiclient/chat/v2/chat_pb.ts` | Regenerated TS API types to include `SupportedModel.isCustom` |
| `webapp/_webapp/src/hooks/useLanguageModels.ts` | Adds `isCustom` field and filters the supported models list |
| `proto/user/v1/user.proto` | Adds `CustomModel` and `Settings.custom_models` |
| `proto/chat/v2/chat.proto` | Adds `SupportedModel.is_custom` |
| `pkg/gen/api/user/v1/user.pb.go` | Regenerated Go API for new user settings fields |
| `pkg/gen/api/chat/v2/chat.pb.go` | Regenerated Go API for `SupportedModel.is_custom` |
| `internal/models/user.go` | Adds backend `CustomModel` and stores it on user `Settings` |
| `internal/api/mapper/user.go` | Maps `custom_models` between proto and DB model |
| `internal/models/llm_provider.go` | Adds `IsCustomModel` to provider config |
| `internal/api/chat/list_supported_models_v2.go` | Returns user custom models as supported models |
| `internal/api/chat/create_conversation_message_stream_v2.go` | Selects provider config based on the chosen custom model |
| `internal/services/toolkit/client/client_v2.go` | Uses the user-provided endpoint/key for custom models |
| `internal/services/toolkit/client/utils_v2.go` | Adjusts default params based on custom vs non-custom |
| `internal/services/toolkit/client/completion_v2.go` | Passes `IsCustomModel` through to default param selection |
| `internal/services/toolkit/client/get_conversation_title_v2.go` | Uses the conversation model for title generation when a custom model is active |
Comments suppressed due to low confidence (1)

internal/services/toolkit/client/client_v2.go:67

  • Allowing users to set an arbitrary Endpoint that is then passed to option.WithBaseURL introduces a server-side SSRF vector (the backend will make HTTP requests to any URL the user provides). Add server-side validation/allowlisting for custom endpoints (e.g., require https, block localhost/private IP ranges, and/or restrict to known domains), ideally when persisting settings and before creating the client.
```go
func (a *AIClientV2) GetOpenAIClient(llmConfig *models.LLMProviderConfig) *openai.Client {
	var Endpoint string = llmConfig.Endpoint
	var APIKey string = llmConfig.APIKey

	if !llmConfig.IsCustomModel {
		if Endpoint == "" {
			if APIKey != "" {
				// User provided their own API key, use the OpenAI-compatible endpoint
				Endpoint = a.cfg.OpenAIBaseURL // standard openai base url
			} else {
				// suffix needed for cloudflare gateway
				Endpoint = a.cfg.InferenceBaseURL + "/openrouter"
			}
		}

		if APIKey == "" {
			APIKey = a.cfg.InferenceAPIKey
		}
	}

	opts := []option.RequestOption{
		option.WithAPIKey(APIKey),
		option.WithBaseURL(Endpoint),
	}
```
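The server-side validation suggested in the comment above could be sketched along these lines. This is a hedged sketch, not the project's implementation: `validateEndpoint` is a hypothetical helper, and production code should additionally re-resolve DNS at request time to defend against rebinding.

```go
package main

import (
	"fmt"
	"net"
	"net/url"
)

// validateEndpoint sketches the checks suggested in the review comment:
// require https, and reject endpoints resolving to loopback, private,
// or link-local addresses before the backend ever connects to them.
func validateEndpoint(raw string) error {
	u, err := url.Parse(raw)
	if err != nil {
		return err
	}
	if u.Scheme != "https" {
		return fmt.Errorf("endpoint must use https")
	}
	ips, err := net.LookupIP(u.Hostname())
	if err != nil {
		return fmt.Errorf("cannot resolve host %q: %w", u.Hostname(), err)
	}
	for _, ip := range ips {
		if ip.IsLoopback() || ip.IsPrivate() || ip.IsLinkLocalUnicast() {
			return fmt.Errorf("endpoint resolves to a disallowed address: %s", ip)
		}
	}
	return nil
}

func main() {
	// A plain-http endpoint is rejected before any DNS lookup happens.
	fmt.Println(validateEndpoint("http://localhost:8080")) // prints "endpoint must use https"
}
```

Running this check both when persisting settings and again before constructing the client covers the case where stored settings predate the validation.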


4ndrelim previously approved these changes Mar 25, 2026
Member

@4ndrelim 4ndrelim left a comment


lgtm. Just need to resolve the merge conflicts and some of Copilot's comments.
Some parts might be going over my head and I am not too sure yet, but I will approve merging into staging and let's test there. I will do a final review before merging to main.

@4ndrelim
Member

@kah-seng I may need you to review #136. See if it clashes with your changes.

@kah-seng kah-seng merged commit 54514a8 into staging Mar 26, 2026
1 check passed
@kah-seng kah-seng deleted the feat/byok branch March 26, 2026 16:41